
    The effects of visual control and distance in modulating peripersonal spatial representation

    In the presence of vision, goal-directed motor acts can trigger spatial remapping, i.e., reference frame transformations that allow for a better interaction with targets. However, it is yet unclear how the peripersonal space is encoded and remapped depending on the availability of visual feedback and on the target position within the individual’s reachable space, and which cerebral areas subserve such processes. Here, functional magnetic resonance imaging (fMRI) was used to examine neural activity while healthy young participants performed reach-to-grasp movements with and without visual feedback and at different distances of the target from the effector (near to the hand, about 15 cm from the starting position, vs. far from the hand, about 30 cm from the starting position). Brain response in the superior parietal lobule bilaterally, in the right dorsal premotor cortex, and in the anterior part of the right inferior parietal lobule was significantly greater during visually-guided grasping of targets located at the far distance compared to grasping of targets located near to the hand. In the absence of visual feedback, the inferior parietal lobule exhibited greater activity during grasping of targets at the near compared to the far distance. Results suggest that in the presence of visual feedback, a visuo-motor circuit integrates visuo-motor information when targets are located farther away. Conversely, in the absence of visual feedback, encoding of space may demand multisensory remapping processes, even in the case of more proximal targets.

    Number magnitude to finger mapping is disembodied and topological

    It has been shown that humans associate fingers with numbers because finger counting strategies interact with numerical judgements. At the same time, there is evidence of a relation between number magnitude and space, as small to large numbers seem to be represented from left to right. In the present study, we investigated whether number magnitude to finger mapping is embodied (related to the order of fingers on the hand) or disembodied (spatial). We let healthy human volunteers name random numbers between 1 and 30 while simultaneously tapping a random finger. The hands were placed either directly next to each other, 30 cm apart, or crossed such that the left hand was on the right side of the body midline. The results show that naming a smaller number than the previous one was associated with tapping a finger to the left of the previously tapped finger. This shows that there is a spatial (disembodied) mapping between number magnitude and fingers. Furthermore, we show that this mapping is topological rather than metrically scaled.

    Manipulable Objects Facilitate Cross-Modal Integration in Peripersonal Space

    Previous studies have shown that tool use often modifies one's peripersonal space, i.e., the space directly surrounding our body. Given our profound experience with manipulable objects (e.g. a toothbrush, a comb or a teapot), in the present study we hypothesized that the observation of pictures representing manipulable objects would result in a remapping of peripersonal space as well. Subjects were required to report the location of vibrotactile stimuli delivered to the right hand, while ignoring visual distractors superimposed on pictures representing everyday objects. Pictures could represent objects of high manipulability (e.g. a cell phone), medium manipulability (e.g. a soap dispenser) or low manipulability (e.g. a computer screen). In the first experiment, when subjects attended to the action associated with the objects, a strong cross-modal congruency effect (CCE) was observed for pictures representing medium and high manipulability objects, reflected in faster reaction times if the vibrotactile stimulus and the visual distractor were in the same location, whereas no CCE was observed for low manipulability objects. This finding was replicated in a second experiment in which subjects attended to the visual properties of the objects. These findings suggest that the observation of manipulable objects facilitates cross-modal integration in peripersonal space.

    Listening to a conversation with aggressive content expands the interpersonal space

    The distance individuals maintain between themselves and others can be defined as ‘interpersonal space’. This distance can be modulated both by situational factors and by individual characteristics. Here we investigated the influence that the interpretation of other people's interactions, in which one is not directly involved, may have on a person’s interpersonal space. In the current study we measured, for the first time, whether the size of interpersonal space changes after listening to other people's conversations with neutral or aggressive content. The results showed that interpersonal space expands after listening to a conversation with aggressive content relative to a conversation with neutral content. This finding suggests that participants tend to distance themselves from an aggressive confrontation even if they are not involved in it. These results are in line with the view of interpersonal space as a safety zone surrounding one’s body.

    Fix Your Eyes in the Space You Could Reach: Neurons in the Macaque Medial Parietal Cortex Prefer Gaze Positions in Peripersonal Space

    Interacting in the peripersonal space requires coordinated arm and eye movements to visual targets in depth. In primates, the medial posterior parietal cortex (PPC) represents a crucial node in the process of visual-to-motor signal transformations. The medial PPC area V6A is a key region engaged in the control of these processes because it jointly processes visual information, eye position and arm movement related signals. However, to date, there is no evidence in the medial PPC of spatial encoding in three dimensions. Here, using single neuron recordings in behaving macaques, we studied the neural signals related to binocular eye position in a task that required the monkeys to perform saccades and fixate targets at different locations in peripersonal and extrapersonal space. A significant proportion of neurons were modulated by both gaze direction and depth, i.e., by the location of the foveated target in 3D space. The population activity of these neurons displayed a strong preference for peripersonal space in a time interval around the saccade that preceded fixation, and during fixation as well. This preference for targets within reaching distance during both target capturing and fixation suggests that binocular eye position signals are implemented functionally in V6A to support its role in reaching and grasping.

    Rubber Hands Feel Touch, but Not in Blind Individuals

    Psychology and neuroscience have a long-standing tradition of studying blind individuals to investigate how visual experience shapes perception of the external world. Here, we study how blind people experience their own body by exposing them to a multisensory body illusion: the somatic rubber hand illusion. In this illusion, healthy blindfolded participants experience that they are touching their own right hand with their left index finger, when in fact they are touching a rubber hand with their left index finger while the experimenter touches their right hand in a synchronized manner (Ehrsson et al. 2005). We compared the strength of this illusion in a group of blind individuals (n = 10), all of whom had experienced severe visual impairment or complete blindness from birth, and a group of age-matched blindfolded sighted participants (n = 12). The illusion was quantified subjectively using questionnaires and behaviorally by asking participants to point to the felt location of the right hand. The results showed that the sighted participants experienced a strong illusion, whereas the blind participants experienced no illusion at all, a difference that was evident in both tests employed. A further experiment testing the participants' basic ability to localize the right hand in space without vision (proprioception) revealed no difference between the two groups. Taken together, these results suggest that blind individuals with impaired visual development have a more veridical percept of self-touch and a less flexible and dynamic representation of their own body in space compared to sighted individuals. We speculate that the multisensory brain systems that re-map somatosensory signals onto external reference frames are less developed in blind individuals and therefore do not allow efficient fusion of tactile and proprioceptive signals from the two upper limbs into a single illusory experience of self-touch as in sighted individuals.

    Touch perception reveals the dominance of spatial over digital representation of numbers

    We learn counting on our fingers, and the digital representation of numbers we develop is still present in adulthood [Andres M, et al. (2007) J Cognit Neurosci 19:563-576]. Such an anatomy-magnitude association establishes tight functional correspondences between fingers and numbers [Di Luca S, et al. (2006) Q J Exp Psychol 59:1648-1663]. However, it has long been known that small-to-large magnitude information is arranged left-to-right along a mental number line [Dehaene S, et al. (1993) J Exp Psychol Gen 122:371-396]. Here, we investigated touch perception to disambiguate whether number representation is embodied on the hand ("1" = thumb; "5" = little finger) or disembodied in the extrapersonal space ("1" = left, "5" = right). We directly contrasted these number representations in two experiments using a single centrally located effector (the foot) and a simple postural manipulation of the hand (palm-up vs. palm-down). We show that visual presentation of a number ("1" or "5") shifts attention cross-modally, modulating the detection of tactile stimuli delivered on the little finger or thumb. With the hand resting palm-down, subjects perform better when reporting tactile stimuli delivered to the little finger after presentation of number "5" than number "1." Crucially, this pattern reverses (better performance after number "1" than "5") when the hand is in a palm-up posture, in which the position of the fingers in external space, but not their relative anatomical position, is reversed. The human brain can thus use either space- or body-based representations of numbers, but in case of competition, the former dominates the latter, showing the stronger role played by the mental number line organization.

    Action ability modulates time‑to‑collision judgments

    Time-to-collision (TTC) underestimation has been interpreted as an adaptive response that allows observers to have more time to engage in defensive behaviour. This bias seems, therefore, strongly linked to action preparation. There is evidence that the observer’s physical fitness modulates the underestimation effect, so that people who need more time to react (i.e. those with less physical fitness) show a stronger underestimation effect. Here we investigated whether this bias is influenced by the momentary action capability of the observers. In the first experiment, participants estimated the time-to-collision of threatening or non-threatening stimuli while being mildly immobilized (with a chin rest) or while standing freely. Restricting the possibility of movement led participants to underestimate the arrival time of the approaching stimuli more strongly. However, this effect was not stronger for threatening relative to non-threatening stimuli. The effect of action capability found in the first experiment could be interpreted as an expansion of peripersonal space (PPS). In the second experiment, we thus investigated the generality of this effect using an established paradigm to measure the size of peripersonal space. Participants bisected lines from different distances while in the chin rest or standing freely. The results replicated the classic left-to-right gradient in lateral spatial attention with increasing viewing distance, but no effect of immobilization was found. The manipulation of the momentary action capability of the observers influenced the participants’ performance in the TTC task but not in the line bisection task. These results are discussed in relation to the different functions of PPS.
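    The quantity being judged in such tasks has a simple definition: for an object approaching at constant speed, the true TTC is the remaining distance divided by the approach speed, and the underestimation bias can be summarized as the ratio of judged to true TTC. The sketch below illustrates this bookkeeping; the function names and the example numbers are illustrative assumptions, not values from the study.

    ```python
    # Illustrative sketch (not the study's analysis code): true TTC for a
    # constant-velocity approach, and a bias ratio where values below 1
    # correspond to the underestimation effect described above.

    def true_ttc(distance_m: float, speed_m_s: float) -> float:
        """Time until contact, in seconds, for constant-velocity approach."""
        return distance_m / speed_m_s

    def ttc_bias(judged_s: float, distance_m: float, speed_m_s: float) -> float:
        """Ratio of judged TTC to true TTC; < 1 indicates underestimation."""
        return judged_s / true_ttc(distance_m, speed_m_s)

    # Hypothetical example: an object 6 m away closing at 3 m/s has a true
    # TTC of 2 s; a judgment of 1.6 s gives a bias ratio of 0.8.
    print(true_ttc(6.0, 3.0))       # 2.0
    print(ttc_bias(1.6, 6.0, 3.0))  # 0.8
    ```

    On this summary measure, the first experiment's finding amounts to the bias ratio moving further below 1 when participants were immobilized.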

    The cognitive neuroscience of prehension: recent developments

    Prehension, the capacity to reach and grasp, is the key behavior that allows humans to change their environment. It continues to serve as a remarkable experimental test case for probing the cognitive architecture of goal-oriented action. This review focuses on recent experimental evidence that enhances or modifies how we might conceptualize the neural substrates of prehension. Emphasis is placed on studies that consider how precision grasps are selected and transformed into motor commands. Then, the mechanisms that extract action-relevant information from vision and touch are considered. These include consideration of how parallel perceptual networks within parietal cortex, along with the ventral stream, are connected and share information to achieve common motor goals. On-line control of grasping action is discussed within a state estimation framework. The review ends with a consideration of how prehension fits within larger action repertoires that solve more complex goals, and the possible cortical architectures needed to organize these actions.